A Riemannian conjugate gradient method for optimization on the Stiefel manifold

Author

  • Xiaojing Zhu
Abstract

In this paper, we propose a new Riemannian conjugate gradient method for optimization on the Stiefel manifold. We introduce two novel vector transports associated with the retraction constructed by the Cayley transform. Both of them satisfy the Ring-Wirth nonexpansive condition, which is fundamental for the convergence analysis of Riemannian conjugate gradient methods, and one of them is also isometric. It is known that the Ring-Wirth nonexpansive condition does not hold for traditional vector transports such as the differentiated retractions of the QR and polar decompositions. Practical formulae of the new vector transports for low-rank matrices are obtained. Dai's nonmonotone conjugate gradient method is generalized to the Riemannian case, and global convergence of the new algorithm is established under standard assumptions. Numerical results on a variety of low-rank test problems demonstrate the effectiveness of the new method.
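
To make the geometric setting concrete, the sketch below illustrates a Cayley-transform retraction on the Stiefel manifold St(n, p) = {X : XᵀX = I}, the kind of curve along which such a method moves. It is a minimal NumPy illustration assuming the standard skew-symmetric construction W = UXᵀ − XUᵀ; it does not reproduce the paper's two new vector transports or its low-rank formulae.

```python
# A minimal sketch (not the paper's algorithm) of a Cayley-transform retraction
# on the Stiefel manifold St(n, p) = {X in R^{n x p} : X^T X = I_p}.
# Assumption: the usual skew-symmetric construction W = U X^T - X U^T; the
# paper's new vector transports and its low-rank formulae are NOT shown here.
import numpy as np

def cayley_retraction(X, U, t):
    """Return (I - t/2 W)^{-1} (I + t/2 W) X, with W = U X^T - X U^T.

    Because W is skew-symmetric, the Cayley factor is orthogonal, so the
    result again has orthonormal columns, i.e. it stays on the manifold.
    Solving an n-by-n system is only for illustration; for p << n one would
    use a low-rank update instead, as the paper does.
    """
    n = X.shape[0]
    W = U @ X.T - X @ U.T                      # skew-symmetric: W^T = -W
    I = np.eye(n)
    return np.linalg.solve(I - 0.5 * t * W, (I + 0.5 * t * W) @ X)

# Quick sanity check: orthonormality is preserved up to rounding error.
rng = np.random.default_rng(0)
X, _ = np.linalg.qr(rng.standard_normal((8, 3)))   # a random Stiefel point
U = rng.standard_normal((8, 3))                    # an arbitrary direction
Y = cayley_retraction(X, U, 0.1)
print(np.linalg.norm(Y.T @ Y - np.eye(3)))         # should be ~1e-15
```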

Similar articles

Motion Estimation in Computer Vision: Optimization on Stiefel Manifolds

Motion recovery from image correspondences is typically a problem of optimizing an objective function associated with the epipolar (or Longuet-Higgins) constraint. This objective function is defined on the so-called essential manifold. In this paper, the intrinsic Riemannian structure of the essential manifold is thoroughly studied. Based on existing optimization techniques on Riemannian manifol...


A Neural Stiefel Learning based on Geodesics Revisited

In this paper we present an unsupervised learning algorithm for neural networks with p inputs and m outputs whose weight vectors are subject to orthonormality constraints. In this setting the learning algorithm can be regarded as optimization posed on the Stiefel manifold, and we generalize the natural gradient method to this case based on geodesics. By exploiting its geometric property as a quotient space: ...


Extensions of the Hestenes-Stiefel and Polak-Ribiere-Polyak conjugate gradient methods with sufficient descent property

Using search directions of a recent class of three-term conjugate gradient methods, modified versions of the Hestenes-Stiefel and Polak-Ribiere-Polyak methods are proposed which satisfy the sufficient descent condition. The methods are shown to be globally convergent when the line search fulfills the (strong) Wolfe conditions. Numerical experiments are done on a set of CUTEr unconstrained opti...
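
For context only, the following sketch shows the classical (unmodified) Hestenes-Stiefel and Polak-Ribiere-Polyak beta formulas that such methods start from; the cited article's three-term modifications with the sufficient descent property are not reproduced, and the function names are illustrative.

```python
# Classical conjugate-gradient beta formulas, for context only; the cited
# article's modified three-term variants are not reproduced here.
import numpy as np

def beta_hs(g_new, g_old, d_old):
    """Hestenes-Stiefel: beta = g_new^T y / (d_old^T y), with y = g_new - g_old."""
    y = g_new - g_old
    return float(g_new @ y) / float(d_old @ y)

def beta_prp(g_new, g_old):
    """Polak-Ribiere-Polyak: beta = g_new^T (g_new - g_old) / ||g_old||^2."""
    return float(g_new @ (g_new - g_old)) / float(g_old @ g_old)

# The next search direction is d_new = -g_new + beta * d_old; the cited work
# uses three-term directions so that a sufficient descent condition holds.
g_old = np.array([1.0, -2.0, 0.5])
g_new = np.array([0.4, -1.1, 0.2])
d_old = -g_old
print(beta_hs(g_new, g_old, d_old), beta_prp(g_new, g_old))
```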


Low-rank tensor completion: a Riemannian manifold preconditioning approach

We propose a novel Riemannian manifold preconditioning approach for the tensor completion problem with rank constraint. A novel Riemannian metric or inner product is proposed that exploits the least-squares structure of the cost function and takes into account the structured symmetry that exists in the Tucker decomposition. The specific metric allows one to use the versatile framework of Riemannian opt...


Statistics on the (compact) Stiefel manifold: Theory and Applications

A Stiefel manifold of the compact type is often encountered in many fields of engineering, including signal and image processing, machine learning, numerical optimization and others. The Stiefel manifold is a Riemannian homogeneous space but not a symmetric space. In previous work, researchers have defined probability distributions on symmetric spaces and performed statistical analysis of data ...



Journal:
  • Comp. Opt. and Appl.

Volume 67, Issue -

Pages -

Publication year 2017